Markov chain noun

  • (probability theory) A discrete-time stochastic process with the Markov property.
Polish: łańcuch Markowa
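
The Markov property mentioned in the definition means that the next state depends only on the current state, not on the earlier history; stated symbolically (a standard formulation, not part of the original entry):

\[
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)
\]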